# generative-pre-trained-transformer

HELM-GPT: de novo macrocyclic peptide design using generative pre-trained transformer

Jupyter Notebook · ★ 23 · Updated 10 months ago

A custom GPT based on [Zero To Hero](https://karpathy.ai/zero-to-hero.html) that uses tiktoken, intended to support AI Transformer-model education and to reverse engineer GPT models from scratch.

Python · ★ 22 · Updated 2 years ago
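The tiktoken this repository mentions is OpenAI's byte-pair-encoding tokenizer library. As a rough illustration of the building block involved, here is a minimal sketch of tokenizing text with it; the "gpt2" encoding matches the vocabulary Karpathy's material targets, and the sample string is arbitrary:

```python
import tiktoken

# Load the byte-pair-encoding vocabulary that GPT-2 was trained with.
enc = tiktoken.get_encoding("gpt2")

text = "Attention is all you need."
ids = enc.encode(text)          # text -> list of integer token ids
print(ids)
print(enc.decode(ids) == text)  # decoding round-trips to the original string
print(enc.n_vocab)              # 50257 entries in the GPT-2 vocabulary
```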

Drawing inspiration from Andrej Karpathy’s iconic lecture, "Let’s Build GPT: From Scratch, in Code, Spelled Out", this project takes you on an immersive journey into the inner workings of GPT. Step by step, we’ll construct a GPT model from the ground up, demystifying its architecture and bringing its mechanics to life through hands-on coding.

Jupyter Notebook · ★ 14 · Updated 5 months ago
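The centerpiece that lecture builds up to is a causal self-attention head. Here is a minimal single-head sketch in PyTorch, close in spirit to the lecture's code but not taken from this repository; `n_embd`, `head_size`, and `block_size` follow the lecture's naming:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Head(nn.Module):
    """One head of causal self-attention, as built up in the lecture."""

    def __init__(self, n_embd, head_size, block_size):
        super().__init__()
        self.key = nn.Linear(n_embd, head_size, bias=False)
        self.query = nn.Linear(n_embd, head_size, bias=False)
        self.value = nn.Linear(n_embd, head_size, bias=False)
        # Lower-triangular mask so a position only attends to the past.
        self.register_buffer("tril", torch.tril(torch.ones(block_size, block_size)))

    def forward(self, x):
        B, T, C = x.shape                     # batch, time, channels
        k, q = self.key(x), self.query(x)     # each (B, T, head_size)
        wei = q @ k.transpose(-2, -1) * k.shape[-1] ** -0.5   # scaled scores
        wei = wei.masked_fill(self.tril[:T, :T] == 0, float("-inf"))
        wei = F.softmax(wei, dim=-1)          # attention weights
        return wei @ self.value(x)            # (B, T, head_size)

x = torch.randn(4, 8, 32)                     # toy batch
out = Head(n_embd=32, head_size=16, block_size=8)(x)
print(out.shape)                              # torch.Size([4, 8, 16])
```

The triangular mask is what makes the model autoregressive: position t can only attend to positions at or before t.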

ToyGPT, inspired by Andrej Karpathy’s GPT from scratch, builds a toy generative pre-trained transformer at its most basic level: a simple bigram language model with attention, meant to teach the basics of creating an LLM from scratch.

Jupyter Notebook · ★ 2 · Updated 5 months ago
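The bigram model ToyGPT starts from is the simplest trainable language model: logits for the next token are read directly out of an embedding table indexed by the current token. A sketch of that baseline in PyTorch, illustrative rather than ToyGPT's actual code; the character-level `vocab_size` of 65 matches Karpathy's Shakespeare example:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BigramLanguageModel(nn.Module):
    """Next-token logits are an embedding-table lookup on the current token."""

    def __init__(self, vocab_size):
        super().__init__()
        self.token_embedding = nn.Embedding(vocab_size, vocab_size)

    def forward(self, idx, targets=None):
        logits = self.token_embedding(idx)    # (B, T, vocab_size)
        if targets is None:
            return logits, None
        B, T, V = logits.shape
        loss = F.cross_entropy(logits.view(B * T, V), targets.view(B * T))
        return logits, loss

vocab_size = 65                               # e.g. character-level Shakespeare
model = BigramLanguageModel(vocab_size)
idx = torch.randint(vocab_size, (4, 8))       # toy batch of token ids
logits, loss = model(idx[:, :-1], targets=idx[:, 1:])  # predict each next token
print(logits.shape, loss.item())
```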

Jupyter Notebook · ★ 1 · Updated 2 years ago

Generative Pre-trained Transformer 1 (GPT-1)

★ 0 · Updated 2 years ago

Repository for all things Natural Language Processing

Jupyter Notebook · ★ 0 · Updated 2 years ago

I built a GPT model from scratch to generate text

Jupyter Notebook · ★ 0 · Updated 1 year ago

A Generatively Pretrained Transformer that generates Shakespeare-esque quotes

Jupyter Notebook · ★ 0 · Updated 2 years ago
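Whatever the model's size, generating quotes like this reduces to one autoregressive sampling loop: feed in the current context, take the logits at the last position, sample one token, append it, and repeat. A sketch of that loop, assuming a model that returns `(logits, loss)` as in the bigram sketch above; `block_size` is the model's context window:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def generate(model, idx, max_new_tokens, block_size=8):
    """Autoregressively extend a (B, T) tensor of token ids."""
    for _ in range(max_new_tokens):
        idx_cond = idx[:, -block_size:]       # crop to the context window
        logits, _ = model(idx_cond)           # (B, T, vocab_size)
        logits = logits[:, -1, :]             # keep only the last time step
        probs = F.softmax(logits, dim=-1)
        idx_next = torch.multinomial(probs, num_samples=1)  # sample one token
        idx = torch.cat((idx, idx_next), dim=1)             # append and continue
    return idx
```

Decoding the returned ids back to text with the tokenizer (or a character table) yields the generated quotes.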