Some demo Seq2Seq models implemented with PyTorch, including RNN, CNN, attention, and Transformer.

wenhaofang/Seq2SeqDemo

# Seq2Seq Demo

This repository includes some demo Seq2Seq models.

Note: this project is based on https://github.com/bentrevett/pytorch-seq2seq


Datasets:

Models:

## Process Data

```sh
PYTHONPATH=. python dataprocess/process.py
```
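The `PYTHONPATH=.` prefix puts the repository root on Python's import path, so a script inside a subfolder such as `dataprocess/` can import root-level packages. A small self-contained illustration of the mechanism (the file layout and contents below are hypothetical, not the repository's actual files):

```python
import os
import subprocess
import sys
import tempfile

# Build a throwaway layout mirroring this repo's structure: a utils package
# at the root, plus a script in a subfolder that imports it the way
# dataprocess/process.py imports root-level packages.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "utils"))
os.makedirs(os.path.join(root, "dataprocess"))
open(os.path.join(root, "utils", "__init__.py"), "w").close()
with open(os.path.join(root, "utils", "helpers.py"), "w") as f:
    f.write("GREETING = 'ok'\n")
with open(os.path.join(root, "dataprocess", "process.py"), "w") as f:
    f.write("from utils.helpers import GREETING\nprint(GREETING)\n")

# Without the root on PYTHONPATH, `from utils...` would fail: by default
# Python only adds the script's own folder (dataprocess/) to sys.path.
env = dict(os.environ, PYTHONPATH=root)
result = subprocess.run(
    [sys.executable, os.path.join("dataprocess", "process.py")],
    cwd=root, env=env, capture_output=True, text=True,
)
print(result.stdout.strip())  # ok
```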

## Unit Test

- For loaders:

```sh
# loader1
PYTHONPATH=. python loaders/loader1.py
# loader2
PYTHONPATH=. python loaders/loader2.py
```

- For modules:

```sh
# module1
PYTHONPATH=. python modules/module1.py
# module2
PYTHONPATH=. python modules/module2.py
# module3
PYTHONPATH=. python modules/module3.py
# module4
PYTHONPATH=. python modules/module4.py
# module5
PYTHONPATH=. python modules/module5.py
# module6
PYTHONPATH=. python modules/module6.py
```

## Main Process

```sh
python main.py
```

You can change the configuration either on the command line or in the file `utils/parser.py`.
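The flag names used throughout the examples suggest an `argparse`-based parser. A hypothetical sketch of what `utils/parser.py` might look like — the flag names follow this README, but the defaults and structure here are illustrative, not the repository's actual code:

```python
import argparse

def get_parser():
    # Flags mirror those used in the README examples; defaults are
    # illustrative, not the repository's actual values.
    p = argparse.ArgumentParser(description="Seq2Seq demo")
    p.add_argument("--module", type=int, default=1, choices=range(1, 7))
    p.add_argument("--grad_clip", type=float, default=1.0)
    p.add_argument("--rnn_type", type=str, default="lstm",
                   choices=["lstm", "gru"])
    p.add_argument("--enc_emb_dim", type=int, default=256)
    p.add_argument("--dec_emb_dim", type=int, default=256)
    p.add_argument("--enc_hid_dim", type=int, default=512)
    p.add_argument("--dec_hid_dim", type=int, default=512)
    p.add_argument("--enc_dropout", type=float, default=0.5)
    p.add_argument("--dec_dropout", type=float, default=0.5)
    return p

# Command-line values override the defaults:
args = get_parser().parse_args(["--module", "3", "--rnn_type", "gru"])
print(args.module, args.rnn_type, args.enc_hid_dim)  # 3 gru 512
```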

Here are the examples for each module:

```sh
# module1
python main.py \
    --module 1 \
    --grad_clip 1 \
    --rnn_type lstm \
    --enc_emb_dim 256 \
    --dec_emb_dim 256 \
    --enc_hid_dim 512 \
    --dec_hid_dim 512 \
    --enc_n_layers 2 \
    --dec_n_layers 2 \
    --enc_n_directions 1 \
    --dec_n_directions 1 \
    --enc_dropout 0.5 \
    --dec_dropout 0.5
# module2
python main.py \
    --module 2 \
    --grad_clip 1 \
    --rnn_type gru \
    --enc_emb_dim 256 \
    --dec_emb_dim 256 \
    --enc_hid_dim 512 \
    --dec_hid_dim 512 \
    --enc_n_layers 1 \
    --dec_n_layers 1 \
    --enc_n_directions 1 \
    --dec_n_directions 1 \
    --enc_dropout 0.5 \
    --dec_dropout 0.5
# module3
python main.py \
    --module 3 \
    --grad_clip 1 \
    --rnn_type gru \
    --enc_emb_dim 256 \
    --dec_emb_dim 256 \
    --enc_hid_dim 512 \
    --dec_hid_dim 512 \
    --enc_n_layers 1 \
    --dec_n_layers 1 \
    --enc_n_directions 2 \
    --dec_n_directions 1 \
    --enc_dropout 0.5 \
    --dec_dropout 0.5
# module4
python main.py \
    --module 4 \
    --grad_clip 1 \
    --rnn_type gru \
    --enc_emb_dim 256 \
    --dec_emb_dim 256 \
    --enc_hid_dim 512 \
    --dec_hid_dim 512 \
    --enc_n_layers 1 \
    --dec_n_layers 1 \
    --enc_n_directions 2 \
    --dec_n_directions 1 \
    --enc_dropout 0.5 \
    --dec_dropout 0.5
# module5
python main.py \
    --module 5 \
    --grad_clip 0.1 \
    --enc_emb_dim 256 \
    --dec_emb_dim 256 \
    --enc_hid_dim 512 \
    --dec_hid_dim 512 \
    --enc_filter_layers 10 \
    --dec_filter_layers 10 \
    --enc_kernel_size 3 \
    --dec_kernel_size 3 \
    --enc_dropout 0.25 \
    --dec_dropout 0.25
# module6
python main.py \
    --module 6 \
    --grad_clip 1 \
    --enc_hid_dim 256 \
    --dec_hid_dim 256 \
    --enc_transformer_layers 3 \
    --dec_transformer_layers 3 \
    --enc_attention_heads 8 \
    --dec_attention_heads 8 \
    --enc_mid_dim 512 \
    --dec_mid_dim 512 \
    --enc_dropout 0.1 \
    --dec_dropout 0.1
```
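For concreteness, here is a minimal, scaled-down sketch (not the repository's actual code) of the kind of model that module1's flags describe: an LSTM encoder-decoder with separate embedding and hidden sizes, trained with teacher forcing. All class and function names below are illustrative, and the dimensions are reduced so it runs instantly.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab, emb_dim, hid_dim, n_layers, dropout):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, n_layers, dropout=dropout)

    def forward(self, src):                # src: (src_len, batch)
        _, (h, c) = self.rnn(self.emb(src))
        return h, c                        # final hidden and cell states

class Decoder(nn.Module):
    def __init__(self, vocab, emb_dim, hid_dim, n_layers, dropout):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb_dim)
        self.rnn = nn.LSTM(emb_dim, hid_dim, n_layers, dropout=dropout)
        self.out = nn.Linear(hid_dim, vocab)

    def forward(self, tok, h, c):          # tok: (batch,) — one step
        o, (h, c) = self.rnn(self.emb(tok.unsqueeze(0)), (h, c))
        return self.out(o.squeeze(0)), h, c  # logits: (batch, vocab)

def seq2seq(enc, dec, src, trg):
    # Encode the source, then decode step by step, feeding the gold
    # target token at each step (teacher forcing).
    h, c = enc(src)
    outs = []
    tok = trg[0]                           # <sos> tokens
    for t in range(1, trg.size(0)):
        logits, h, c = dec(tok, h, c)
        outs.append(logits)
        tok = trg[t]
    return torch.stack(outs)               # (trg_len - 1, batch, vocab)

# Tiny stand-ins for the README's 256/512 dims and 2-layer setting.
enc = Encoder(vocab=10, emb_dim=8, hid_dim=16, n_layers=2, dropout=0.5)
dec = Decoder(vocab=10, emb_dim=8, hid_dim=16, n_layers=2, dropout=0.5)
src = torch.randint(0, 10, (5, 3))         # (src_len=5, batch=3)
trg = torch.randint(0, 10, (6, 3))         # (trg_len=6, batch=3)
logits = seq2seq(enc, dec, src, trg)
print(tuple(logits.shape))                 # (5, 3, 10)
```

The `--grad_clip` flag would correspond to calling `torch.nn.utils.clip_grad_norm_` on the model parameters before each optimizer step.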
