Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Pre-trained Chinese ELECTRA models
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
Pretrained ELECTRA Model for Korean
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word embeddings, Chinese text classification, named entity recognition, abstractive text summarization, sentence similarity, triple extraction, pre-trained models, and more.
Turkish BERT/DistilBERT, ELECTRA and ConvBERT models
Pretrain and fine-tune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
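For orientation, here is a minimal fine-tuning sketch using the Hugging Face transformers API directly rather than this repo's fastai wrapper; the google/electra-small-discriminator checkpoint is the published small ELECTRA model, while the toy batch, labels, and single AdamW step are illustrative assumptions standing in for a real training loop:

```python
import torch
from transformers import ElectraForSequenceClassification, ElectraTokenizerFast

tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
model = ElectraForSequenceClassification.from_pretrained(
    "google/electra-small-discriminator", num_labels=2
)

# Toy batch: two sentences with binary sentiment-style labels (illustrative only).
batch = tokenizer(["a great movie", "a terrible movie"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

# One manual optimization step, standing in for a full fine-tuning loop.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)  # outputs.loss is the classification loss
outputs.loss.backward()
optimizer.step()
print("loss:", float(outputs.loss))
```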
🤗 Korean Comments ELECTRA: an ELECTRA model trained on Korean comments
Build and train state-of-the-art natural language processing models using BERT
Pytorch-Named-Entity-Recognition-with-transformers
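As a generic illustration of transformer-based NER (not this repository's training code), the Hugging Face token-classification pipeline can tag entities with a fine-tuned checkpoint; the dbmdz CoNLL-03 model named below is a commonly used public checkpoint, assumed here for the example:

```python
from transformers import pipeline

# Load a NER pipeline; aggregation merges word pieces into whole entities.
ner = pipeline(
    "token-classification",
    model="dbmdz/bert-large-cased-finetuned-conll03-english",
    aggregation_strategy="simple",
)

for entity in ner("George Washington lived in Philadelphia."):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```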
AI and Memory Wall
DBMDZ BERT, DistilBERT, ELECTRA, GPT-2 and ConvBERT models
"Intel Innovation Masters Cup" Deep Learning Challenge, Track 2: CCKS2021 Chinese NLP address element parsing
Pre-trained Chinese ELECTRA model: pretraining a Chinese model with adversarial learning
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators
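A small sketch of the replaced-token-detection objective the paper describes, using the published discriminator checkpoint through Hugging Face transformers; the hand-corrupted sentence below stands in for samples that would normally come from the generator:

```python
from transformers import ElectraForPreTraining, ElectraTokenizerFast

tokenizer = ElectraTokenizerFast.from_pretrained("google/electra-small-discriminator")
discriminator = ElectraForPreTraining.from_pretrained("google/electra-small-discriminator")

# Hand-made corruption: "fake" replaces the original token "jumps".
corrupted = "The quick brown fox fake over the lazy dog"
inputs = tokenizer(corrupted, return_tensors="pt")

# The discriminator emits one logit per token; > 0 means "predicted replaced".
logits = discriminator(**inputs).logits[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, logit in zip(tokens, logits):
    print(f"{token:>7}  replaced={bool(logit > 0)}")
```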
Turkish-Reading-Comprehension-Question-Answering-Dataset
Baseline code for Korean open-domain question answering (ODQA)
ELECTRA model pre-trained on a Vietnamese corpus
GLUE benchmark code based on bert4keras