
Commit 6329645

godweiyang and Taka152 authored
remove fairseq cli and modules from examples (bytedance#234)
remove fairseq cli and modules from examples (bytedance#234)

* remove fairseq cli and modules from examples
* modify all cli import paths
* add cli for lightseq-deepspeed, delete useless __init__.py
* fix bug for test_ls_ops (bert encoder layer)
* format files
* remove examples from build
* remove examples cpp from cmakelist
* update version to 2.2.1
* Update CMakeLists.txt: recover cpp example build

Co-authored-by: Ying Xiong <[email protected]>
1 parent 65d87b2, commit 6329645

25 files changed: +42 / -38 lines

MANIFEST.in

Lines changed: 1 addition & 0 deletions
@@ -3,3 +3,4 @@ global-include *.cu *.cpp *.cc *.cuh *.h *.ldscript *.proto *.cmake
 prune dist
 prune build
 prune tests
+prune examples
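
The new `prune examples` directive excludes the entire `examples/` tree from the source distribution, consistent with the existing `prune` lines for `dist`, `build`, and `tests`. A quick local check (a sketch, assuming the sdist is built with setuptools and named `lightseq-*.tar.gz`):

```bash
# Build the sdist and confirm nothing under examples/ is packaged anymore.
python setup.py sdist
tar -tzf dist/lightseq-*.tar.gz | grep "examples/" \
    && echo "examples/ still present" \
    || echo "examples/ pruned from the sdist"
```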

examples/inference/__init__.py

Whitespace-only changes.

examples/inference/python/__init__.py

Whitespace-only changes.

examples/inference/python/export/__init__.py

Whitespace-only changes.

examples/inference/python/test/__init__.py

Whitespace-only changes.

examples/training/__init__.py

Whitespace-only changes.

examples/training/deepspeed/ds_fairseq.py

Lines changed: 2 additions & 2 deletions
@@ -8,8 +8,8 @@
 from fairseq import tasks, distributed_utils
 from fairseq.logging import metrics
 
-from examples.training.deepspeed.ds_fairseq_data import BatchIterator
-from examples.training.deepspeed.ds_fairseq_argument import gen_ds_fairseq_arg
+from ds_fairseq_data import BatchIterator
+from ds_fairseq_argument import gen_ds_fairseq_arg
 
 
 best_bleu = 0.0
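
The imports drop the removed `examples.training.deepspeed` package prefix in favor of plain sibling-module imports, which resolve as long as the script's own directory is on `sys.path` (as it is when the launcher runs the file directly). A minimal import sanity check (a sketch, assuming it is run from a repository checkout):

```bash
# Verify the sibling modules resolve by name from the script's own directory.
cd examples/training/deepspeed
python -c "from ds_fairseq_data import BatchIterator; from ds_fairseq_argument import gen_ds_fairseq_arg; print('local imports OK')"
```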

examples/training/deepspeed/ds_fairseq_wmt14en2de.sh

Lines changed: 1 addition & 2 deletions
@@ -9,9 +9,8 @@ if [ ! -d "/tmp/wmt14_en_de" ]; then
     tar -zxvf /tmp/databin_wmt14_en_de.tar.gz -C /tmp && rm /tmp/databin_wmt14_en_de.tar.gz
 fi
 
-deepspeed ${THIS_DIR}/ds_fairseq.py \
+lightseq-deepspeed ${THIS_DIR}/ds_fairseq.py \
     /tmp/wmt14_en_de/ \
-    --user-dir ${THIS_DIR}/../fairseq/fs_modules \
     --arch ls_transformer_wmt_en_de_big_t2t --share-decoder-input-output-embed \
     --optimizer ls_adam --adam-betas '(0.9, 0.98)' --clip-norm 0.0 \
     --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000 \
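
The script now launches through the `lightseq-deepspeed` entry point introduced in this commit instead of the stock `deepspeed` launcher, and the `--user-dir` flag is gone, presumably because the relocated `fs_modules` (now under `lightseq/training/cli`, per the README change below) are registered by the CLI itself. Usage from the caller's side is unchanged (a sketch, assuming the script is invoked from the repository root):

```bash
# Launch WMT14 En-De training via the new LightSeq DeepSpeed entry point.
bash examples/training/deepspeed/ds_fairseq_wmt14en2de.sh
```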

examples/training/fairseq/README.md

Lines changed: 1 addition & 1 deletion
@@ -13,7 +13,7 @@ sh examples/training/fairseq/ls_fairseq_wmt14en2de.sh
 ```
 
 Or you can use LightSeq modules like `--arch ls_transformer_wmt_en_de_big_t2t`,
-by adding `--user-dir=${LIGHTSEQ_DIR}/examples/training/fairseq/fs_modules`
+by adding `--user-dir=${LIGHTSEQ_DIR}/lightseq/training/cli/fs_modules`
 to `fairseq-train`.
 
 This script firstly download the dataset and then run native Fairseq
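
With `fs_modules` now shipped inside the installed package, a native `fairseq-train` run points `--user-dir` at the new location. A hedged sketch, borrowing the flags from the DeepSpeed script above and assuming `LIGHTSEQ_DIR` points to the repository checkout; adjust the data path and hyperparameters to your setup:

```bash
# Native fairseq-train using LightSeq layers from the relocated fs_modules.
fairseq-train /tmp/wmt14_en_de/ \
    --user-dir=${LIGHTSEQ_DIR}/lightseq/training/cli/fs_modules \
    --arch ls_transformer_wmt_en_de_big_t2t --share-decoder-input-output-embed \
    --optimizer ls_adam --adam-betas '(0.9, 0.98)' --clip-norm 0.0 \
    --lr 5e-4 --lr-scheduler inverse_sqrt --warmup-updates 4000
```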

examples/training/fairseq/__init__.py

Whitespace-only changes.
