bart
LightSeq: A High Performance Library for Sequence Processing and Generation
Open Source Pre-training Model Framework in PyTorch & Pre-trained Model Zoo
Rust native ready-to-use NLP pipelines and transformer-based models (BERT, DistilBERT, GPT2,...)
Self-contained Machine Learning and Natural Language Processing library in Go
Tencent Pre-training framework in PyTorch & Pre-trained Model Zoo
Multilingual/multi-domain datasets, models, and Python library for question generation.
Cybertron: the home planet of the Transformers in Go
MinT: Minimal Transformer Library and Tutorials
Build and train state-of-the-art natural language processing models using BERT
Code for the AAAI 2022 paper "Open Vocabulary Electroencephalography-To-Text Decoding and Zero-shot Sentiment Classification"
Calculate perplexity on a text with pre-trained language models. Supports masked LMs (e.g., DeBERTa), autoregressive LMs (e.g., GPT-3), and encoder-decoder LMs (e.g., Flan-T5). A minimal perplexity sketch appears after this list.
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
Automated categorization of bank transaction descriptions using neural networks, reducing manual effort while preserving privacy.
BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese (INTERSPEECH 2022)
Abstractive and extractive text summarization using Transformers (see the summarization sketch after this list).
NAACL 2021 - Progressive Generation of Long Text
Official implementation of the paper "IteraTeR: Understanding Iterative Revision from Human-Written Text" (ACL 2022)
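The perplexity entry above describes scoring text with pre-trained language models. A minimal sketch of that idea using the Hugging Face Transformers library directly (GPT-2 stands in for a causal language model, since GPT-3 weights are not public); this is an illustrative assumption, not the API of any specific repository listed here:

```python
# Perplexity sketch with Hugging Face Transformers; GPT-2 is assumed as an
# illustrative causal language model, not tied to any repo above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

text = "BART is a denoising autoencoder for pretraining sequence-to-sequence models."
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing input_ids as labels makes the model return the mean
    # token-level negative log-likelihood as .loss.
    loss = model(**enc, labels=enc["input_ids"]).loss

perplexity = torch.exp(loss).item()
print(f"Perplexity: {perplexity:.2f}")
```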
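Several entries above involve abstractive summarization with BART-style models. A hedged sketch using the Hugging Face Transformers pipeline with the public facebook/bart-large-cnn checkpoint, chosen here as an example; the listed repositories may use different models or their own interfaces:

```python
# Abstractive summarization sketch with a BART checkpoint via the Hugging Face
# pipeline API; facebook/bart-large-cnn is assumed as an illustrative public model.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "BART is a sequence-to-sequence model pre-trained as a denoising "
    "autoencoder: the input text is corrupted (e.g., spans are masked or "
    "sentences shuffled) and the model learns to reconstruct the original. "
    "Fine-tuned on summarization data, it generates fluent abstractive summaries."
)

# The pipeline returns a list of dicts with a "summary_text" field.
summary = summarizer(article, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```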