electra
Rust-native, ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2, ...)
Pre-trained Chinese ELECTRA models
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
Pretrained ELECTRA Model for Korean
Implementations of common NLP tasks, including new-word discovery, as well as PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive text summarization, sentence-similarity judgment, triple extraction, pre-trained models, and more.
Pretrain and finetune ELECTRA with fastai and huggingface. (Results of the paper replicated!)
🤗 Korean Comments ELECTRA: an ELECTRA model trained on Korean comments
Build and train state-of-the-art natural language processing models using BERT
AI and Memory Wall
Pytorch-Named-Entity-Recognition-with-transformers
DBMDZ BERT, DistilBERT, ELECTRA, GPT-2 and ConvBERT models
Chinese pre-trained ELECTRA model: pretraining a Chinese model with adversarial learning
ELECTRA: Pre-training Text Encoders as Discriminators Rather Than Generators (see the sketch after this list)
Turkish-Reading-Comprehension-Question-Answering-Dataset
Baseline code for Korean open-domain question answering (ODQA)
ELECTRA pre-trained model using a Vietnamese corpus
GLUE benchmark code based on bert4keras
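The ELECTRA entry above names the core idea behind most of these repositories: instead of training a generator with masked-language modeling, a discriminator is pre-trained to detect which tokens in a corrupted sentence were replaced. A minimal sketch of querying such a discriminator follows, assuming the Hugging Face transformers library and the public `google/electra-small-discriminator` checkpoint; the corrupted example sentence is an illustrative assumption, not taken from any repository listed here.

```python
# Sketch of ELECTRA's replaced-token-detection objective, assuming the
# Hugging Face transformers library and the public reference checkpoint
# "google/electra-small-discriminator". The corrupted sentence is an
# illustrative assumption, not from any repository in this list.
import torch
from transformers import ElectraForPreTraining, ElectraTokenizerFast

model_name = "google/electra-small-discriminator"
tokenizer = ElectraTokenizerFast.from_pretrained(model_name)
discriminator = ElectraForPreTraining.from_pretrained(model_name)

# "fake" corrupts the original sentence ("... fox jumps over ...");
# the discriminator is trained to flag such replaced tokens.
corrupted = "the quick brown fox fake over the lazy dog"
inputs = tokenizer(corrupted, return_tensors="pt")

with torch.no_grad():
    logits = discriminator(**inputs).logits  # one score per input token

# A positive logit means the model predicts "replaced" at that position.
labels = (logits > 0).long().squeeze().tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for token, label in zip(tokens, labels):
    print(f"{token:>8s}  {'replaced' if label else 'original'}")
```

Because the discriminator scores every token rather than only a masked subset, ELECTRA learns from all input positions, which is the source of the sample efficiency the paper reports.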