generative-pre-trained-transformer
PhoGPT: Generative Pre-training for Vietnamese (2023)
An autoregressive language model like ChatGPT.
HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer
A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support Transformer-model education and the reverse-engineering of GPT models from scratch.
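For readers unfamiliar with tiktoken (OpenAI's byte-pair-encoding tokenizer), here is a minimal sketch of the encode/decode round-trip such a project relies on; the choice of the `gpt2` encoding is an assumption, not a detail from the repo:

```python
import tiktoken

# Load the GPT-2 byte-pair encoding (an assumed choice of vocabulary;
# from-scratch GPT builds commonly start from it).
enc = tiktoken.get_encoding("gpt2")

text = "Let's build GPT from scratch."
tokens = enc.encode(text)          # text -> list of integer token ids
assert enc.decode(tokens) == text  # decoding round-trips to the original text
print(len(tokens), tokens)
```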
Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step-by-step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.
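The core step in a from-scratch build of this kind is a causally masked, scaled dot-product self-attention head. Below is a minimal single-head sketch in PyTorch, with illustrative tensor sizes rather than values taken from any particular repository:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

B, T, C = 4, 8, 32   # batch, sequence length, embedding dim (illustrative)
head_size = 16

x = torch.randn(B, T, C)
key   = nn.Linear(C, head_size, bias=False)
query = nn.Linear(C, head_size, bias=False)
value = nn.Linear(C, head_size, bias=False)

k, q, v = key(x), query(x), value(x)               # each (B, T, head_size)
wei = q @ k.transpose(-2, -1) * head_size ** -0.5  # attention scores (B, T, T)
tril = torch.tril(torch.ones(T, T))
wei = wei.masked_fill(tril == 0, float('-inf'))    # causal mask: no peeking ahead
wei = F.softmax(wei, dim=-1)                       # each row sums to 1
out = wei @ v                                      # weighted sum of values (B, T, head_size)
```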
A simple GPT app that uses the falcon-7b-instruct model with a Flask front-end.
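As a rough illustration of that pattern, here is a minimal sketch of a Flask app serving falcon-7b-instruct through the Hugging Face transformers pipeline; the route name, generation parameters, and device settings are assumptions, not details from the repo:

```python
from flask import Flask, jsonify, request
from transformers import pipeline

app = Flask(__name__)

# Load the instruction-tuned Falcon model once at startup.
# device_map="auto" assumes the accelerate package is installed; adjust as needed.
generator = pipeline(
    "text-generation",
    model="tiiuae/falcon-7b-instruct",
    device_map="auto",
)

@app.route("/generate", methods=["POST"])  # route name is an assumption
def generate():
    prompt = request.json.get("prompt", "")
    output = generator(prompt, max_new_tokens=128, do_sample=True)
    return jsonify({"completion": output[0]["generated_text"]})

if __name__ == "__main__":
    app.run()
```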
ToyGPT, inspired by Andrej Karpathy's GPT-from-scratch lecture, builds a toy generative pre-trained transformer at its most basic level, using a simple bigram language model with attention, to teach the basics of creating an LLM from scratch.
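A bigram language model is the simplest autoregressive baseline: the next token is predicted from the current token alone via a lookup table. A minimal PyTorch sketch (the vocabulary size here is illustrative, e.g. character-level):

```python
import torch
import torch.nn as nn
from torch.nn import functional as F

class BigramLanguageModel(nn.Module):
    """Each token reads logits for the next token directly from a lookup table."""

    def __init__(self, vocab_size):
        super().__init__()
        self.token_embedding_table = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx):
        # idx: (B, T) tensor of token ids -> logits over the vocabulary
        return self.token_embedding_table(idx)      # (B, T, vocab_size)

    def generate(self, idx, max_new_tokens):
        for _ in range(max_new_tokens):
            logits = self(idx)[:, -1, :]             # logits at the last position
            probs = F.softmax(logits, dim=-1)
            idx_next = torch.multinomial(probs, num_samples=1)
            idx = torch.cat((idx, idx_next), dim=1)  # append sampled token
        return idx

model = BigramLanguageModel(vocab_size=65)           # 65 is illustrative
start = torch.zeros((1, 1), dtype=torch.long)
print(model.generate(start, max_new_tokens=20))
```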
An industrial project applying NLP to a finance application.
Repository for personal experiments
Generative Pre-trained Transformer (GPT-1)
Repository for all things Natural Language Processing
PyTorch implementation of GPT from scratch
I built a GPT model from scratch to generate text
A Generatively Pretrained Transformer that generates Shakespeare-esque quotes