v0.8.1
News
- GluonNLP was featured at KDD 2019 in Alaska! Check out our tutorial: "From Shallow to Deep Language Representations: Pre-training, Fine-tuning, and Beyond".
- GluonNLP 0.8.1 will no longer support Python 2. (#721, #838)
- Interested in BERT int8 quantization for deployment? Check out the blog post here.
Models and Scripts
RoBERTa
- The RoBERTa model, introduced by Yinhan Liu et al. in "RoBERTa: A Robustly Optimized BERT Pretraining Approach", is now available. The model checkpoints are converted from the original repository. Check out the usage here, and see the loading sketch below. (#870)
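
A minimal sketch of loading the converted RoBERTa checkpoint through the model zoo. The model and dataset identifiers (`roberta_12_768_12`, `openwebtext_ccnews_stories_books_cased`) and the token ids are assumptions for illustration; check the model zoo page for the exact names.

```python
import mxnet as mx
import gluonnlp as nlp

# Assumed model-zoo identifiers for the 12-layer RoBERTa base checkpoint.
model, vocab = nlp.model.get_model(
    'roberta_12_768_12',
    dataset_name='openwebtext_ccnews_stories_books_cased',
    use_decoder=False,
    ctx=mx.cpu())

# Run a forward pass over a batch of illustrative (not real) token ids.
token_ids = mx.nd.array([[0, 31414, 232, 2]])
valid_length = mx.nd.array([4])
encoding = model(token_ids, valid_length)  # contextual embedding per token
print(encoding.shape)
```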
Transformer-XL
- The Transformer-XL model, introduced by Zihang Dai et al. in "Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context", is now available. (#846)
Bug Fixes
- Fixed hybridization for the BERT model (#877); a minimal hybridization sketch follows this list
- Changed the variable `model` to `bert_classifier` (#828), thanks to @LindenLiu
- Reverted "Add axis argument to squeeze()" (#857)
- [BUGFIX] Removed the incorrect vocab.padding_token requirement in CorpusBPTTBatchify
- [BUGFIX] Fixed Vocab with unknown_token remapped to != 0 via the token_to_idx argument (#862)
- [BUGFIX] Fixed AMP in finetune_classifier.py (#848)
- [BUGFIX] Fixed a broken multi-head attention cell (#878), thanks to @ZiyueHuang
- [FIX] Fixed the chnsenticorp dataset download link (#873)
- Fixed the usage of pad in BERT (#850)
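
For reference, a minimal sketch of the hybridized BERT path exercised by the fix in #877. The model and dataset identifiers (`bert_12_768_12`, `book_corpus_wiki_en_uncased`) and the token ids are assumptions for illustration only.

```python
import mxnet as mx
import gluonnlp as nlp

# Assumed model-zoo identifiers for the 12-layer BERT base checkpoint.
bert, vocab = nlp.model.get_model(
    'bert_12_768_12',
    dataset_name='book_corpus_wiki_en_uncased',
    use_decoder=False,
    use_classifier=False,
    ctx=mx.cpu())

# Hybridize so the forward pass runs through a cached static graph.
bert.hybridize(static_alloc=True)

tokens = mx.nd.array([[2, 1037, 2023, 3]])   # illustrative token ids only
token_types = mx.nd.zeros_like(tokens)
valid_length = mx.nd.array([4])
seq_encoding, pooled = bert(tokens, token_types, valid_length)
print(seq_encoding.shape, pooled.shape)
```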
Documentation
- Clarified that BERT no longer requires MXNet nightly (#860)
- [DOC] Fixed broken links (#833)
- [DOC] Updated the BERT index.rst (#844)
- [DOC] Added the GluonCV/NLP archive (#823)
- [DOC] Added missing dataset documentation (#832)
- [DOC] Removed an incorrect tutorial header level (#826)
- [DOC] Fixed a typo in attention_cell's docstring (#841), thanks to @shenfei
- [DOC] Upgraded the MXNet dependency to 1.5.0 and switched to CUDA 10.1 on CI (#842)
- Removed the Python 2 badge from the README and added Python 3.7 (#856)
- [DOC] Improved help messages (#855), thanks to @apeforest
- Updated index.rst (#853)
- [DOC] Fixed the Machine Translation with Transformers example (#865)
- Updated the button style (#869)
- [DOC] Fixed the documentation for vocab.subwords (#885), thanks to @liusy182