Repository navigation
gpt2
- Website
- Wikipedia
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Chinese NLP solutions (large models, data, models, training, inference)
GPT2 for Chinese chitchat: a GPT2 model for Chinese casual dialogue (implements the MMI idea from DialoGPT)
RoBERTa pre-trained models for Chinese: RoBERTa for Chinese
A fast and user-friendly runtime for transformer inference (BERT, ALBERT, GPT2, decoders, etc.) on CPU and GPU
Chinese news-title generation with GPT2: a Chinese GPT2 news-headline generation project with extremely detailed comments
Simple text generator using an OpenAI GPT-2 PyTorch implementation
Adan: Adaptive Nesterov Momentum Algorithm for Faster Optimizing Deep Models
Pre-trained Transformers for Arabic Language Understanding and Generation (Arabic BERT, Arabic GPT2, Arabic ELECTRA)
A PyTorch implementation of "Graph Wavelet Neural Network" (ICLR 2019)
Implementations of common NLP tasks, including new-word discovery, plus PyTorch-based word vectors, Chinese text classification, entity recognition, abstractive summarization, sentence-similarity judgment, triple extraction, pre-trained models, and more
MindSpore online courses: Step into LLM
Guide: Fine-tune GPT2-XL (1.5 billion parameters) and GPT-NEO (2.7B) on a single GPU with Hugging Face Transformers using DeepSpeed
PyTorch Implementation of OpenAI GPT-2